Learning Permutation-Invariant Embeddings for Description Logic Concepts

Authors

Caglar Demir, Axel-Cyrille Ngonga Ngomo

Abstract

Concept learning deals with learning description logic concepts from a background knowledge and input examples. The goal is to learn a concept that covers all positive examples, while not covering any negative examples. This non-trivial task is often formulated as a search problem within an infinite quasi-ordered concept space. Although state-of-the-art models have been successfully applied to tackle this problem, their large-scale applications have been severely hindered by excessive exploration incurring impractical runtimes. Here, we propose a remedy for this limitation. We reformulate the learning problem as a multi-label classification problem and propose a neural embedding model (NERO) that learns permutation-invariant embeddings for sets of examples, tailored towards predicting $$F_1$$ scores of pre-selected concepts. By ranking such concepts in descending order of predicted scores, a possible goal concept can be detected within a few retrieval operations, i.e., no exploration. Importantly, the top-ranked concepts can be used to start the search procedure of state-of-the-art symbolic models in multiple advantageous regions of a concept space, rather than starting it in the most general concept $$\top$$. Our experiments on 5 benchmark datasets with 770 learning problems firmly suggest that NERO significantly (p-value $$<1\%$$) outperforms the state-of-the-art models in terms of $$F_1$$ score, the number of explored concepts, and the total runtime. We provide an open-source implementation of our approach ( https://github.com/dice-group/Nero ).
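The abstract does not spell out the architecture, so the following is only a minimal sketch of the general idea, assuming a Deep-Sets-style encoder in PyTorch; all names (SetF1Predictor, phi, rho, the dimensions) are hypothetical illustrations, not NERO's actual implementation:

import torch
import torch.nn as nn

class SetF1Predictor(nn.Module):
    # Embeds the sets of positive and negative example individuals and predicts
    # an F1 score for each concept in a fixed pool of pre-selected concepts.
    def __init__(self, num_individuals, num_concepts, dim=32):
        super().__init__()
        self.emb = nn.Embedding(num_individuals, dim)  # one vector per individual
        self.phi = nn.Sequential(nn.Linear(dim, dim), nn.ReLU())
        self.rho = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(),
                                 nn.Linear(dim, num_concepts), nn.Sigmoid())

    def forward(self, pos_idx, neg_idx):
        # Mean pooling over each set makes the encoding invariant to the order
        # in which the examples are listed.
        pos = self.phi(self.emb(pos_idx)).mean(dim=1)
        neg = self.phi(self.emb(neg_idx)).mean(dim=1)
        return self.rho(torch.cat([pos, neg], dim=-1))  # one predicted F1 per concept

model = SetF1Predictor(num_individuals=1000, num_concepts=500)
scores = model(torch.randint(0, 1000, (1, 5)), torch.randint(0, 1000, (1, 5)))
candidates = scores.topk(k=10, dim=-1).indices  # concepts to test, or seeds for a symbolic search

Because mean pooling commutes with any permutation of the set elements, the prediction does not depend on the order of the examples, which is the permutation invariance the title refers to; ranking the fixed concept pool by predicted score then replaces search with a handful of retrieval operations.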


Similar Articles

Learning Multiple Description Logics Concepts

Description-logic-based languages have become the standard representation scheme for ontologies. They formalize domain knowledge using interrelated concepts contained in terminologies. The manual definition of terminologies is an expensive and error-prone task; therefore, automatic learning methods are a necessity. In this paper we lay the foundations of a multiple concept learning method ...


Learning Concept Embeddings for Efficient Bag-of-Concepts Densification

Explicit concept space models have proven effective for text representation in many natural language and text mining applications. The idea is to embed textual structures into a semantic space of concepts which captures the main topics of these structures. This so-called bag-of-concepts representation suffers from data sparsity, causing low similarity scores between similar texts due to low conce...
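As a hedged illustration of the general densification idea this blurb names (toy data, not the paper's method), the sparse bag-of-concepts vector can be replaced by a weighted average of learned concept embeddings, so that texts mentioning related but non-identical concepts still obtain a nonzero similarity:

import numpy as np

rng = np.random.default_rng(0)
concept_emb = rng.normal(size=(5, 8))  # toy embeddings for a vocabulary of 5 concepts

def densify(boc_weights):
    # Map a sparse concept-weight vector to a dense vector in embedding space.
    w = np.asarray(boc_weights, dtype=float)
    return w @ concept_emb / max(w.sum(), 1e-9)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

doc_a = densify([1, 0, 1, 0, 0])  # mentions concepts 0 and 2
doc_b = densify([0, 1, 0, 1, 0])  # disjoint concepts: zero overlap in the sparse view
print(cosine(doc_a, doc_b))       # nonzero whenever the concept embeddings are correlated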


DL-Learner: Learning Concepts in Description Logics

In this paper, we introduce DL-Learner, a framework for learning in description logics and OWL. OWL is the official W3C standard ontology language for the Semantic Web. Concepts in this language can be learned for constructing and maintaining OWL ontologies or for solving problems similar to those in Inductive Logic Programming. DL-Learner includes several learning algorithms, support for diffe...


Translation Invariant Word Embeddings

This work focuses on the task of finding latent vector representations of the words in a corpus. In particular, we address the issue of what to do when there are multiple languages in the corpus. Prior work has, among other techniques, used canonical correlation analysis to project pre-trained vectors in two languages into a common space. We propose a simple and scalable method that is inspired...
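The canonical-correlation baseline the blurb contrasts against can be sketched with scikit-learn (toy vectors and hypothetical dimensions; this illustrates the prior work, not the paper's proposed method):

import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
X_en = rng.normal(size=(100, 50))                   # toy source-language word vectors
W = rng.normal(size=(50, 40))
X_de = X_en @ W + 0.1 * rng.normal(size=(100, 40))  # toy target-language vectors of translation pairs

cca = CCA(n_components=20)
cca.fit(X_en, X_de)                     # learn one projection per language from the paired vectors
Z_en, Z_de = cca.transform(X_en, X_de)  # both languages now live in a shared 20-dimensional space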


Permutation invariant lattices

We say that a Euclidean lattice in $$\mathbb{R}^n$$ is permutation invariant if its automorphism group has non-trivial intersection with the symmetric group $$S_n$$, i.e., if the lattice is closed under the action of some non-identity elements of $$S_n$$. Given a fixed element $$\tau \in S_n$$, we study properties of the set of all lattices closed under the action of $$\tau$$: we call such lattices $$\tau$$-invariant. These lattices natura...
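A small worked example of the definition (my illustration, not taken from the paper): the checkerboard lattice $$L = \{(x, y) \in \mathbb{Z}^2 : x + y \equiv 0 \pmod 2\}$$ is closed under the transposition $$\tau = (1\,2)$$, since swapping the coordinates of a lattice vector preserves both integrality and the parity of $$x + y$$; hence $$L$$ is $$\tau$$-invariant.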



Journal

Journal: Lecture Notes in Computer Science

Year: 2023

ISSN: 1611-3349, 0302-9743

DOI: https://doi.org/10.1007/978-3-031-30047-9_9